GALEN - 2016


Section: New Results

Learning with Non-modular Loss Functions

Participants: Jiaqian Yu, Matthew Blaschko

We have proposed an alternating direction method of multipliers (ADMM) based decomposition method for loss-augmented inference that relies only on two individual solvers, one for the loss function term and one for the inference term, treated as two independent subproblems. In this way we gain computational efficiency and greater flexibility in choosing the non-modular loss functions of interest. We have also proposed a novel supermodular loss function that empirically achieves better performance on object boundaries and on elongated structures [33]. In addition, we have introduced a novel convex surrogate operator for general non-modular loss functions, which provides for the first time a tractable solution for loss functions that are neither supermodular nor submodular, e.g. the Dice loss. This surrogate is based on a canonical submodular-supermodular decomposition, whose existence and uniqueness we have demonstrated. We have further proven that the surrogate is convex, piecewise linear, and an extension of the loss function, and that its subgradient can be computed in polynomial time [32], [31].
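
To make the ADMM splitting concrete, the following is a minimal sketch of the consensus form of loss-augmented inference over a continuous relaxation y, z in [0, 1]^n. The callbacks loss_prox and inference_prox are hypothetical placeholders standing in for the two independent solvers mentioned above (one maximizing the loss term, one the model score, each penalized by the quadratic coupling term); the names, signatures, and update order are illustrative assumptions, not the exact scheme of [33].

```python
import numpy as np

def admm_loss_augmented_inference(loss_prox, inference_prox, n,
                                  rho=1.0, n_iters=100, tol=1e-4):
    """Sketch of consensus ADMM for loss-augmented inference (assumed interface).

    loss_prox(v, rho)      -> argmax_y  loss(y)        - (rho/2) * ||y - v||^2
    inference_prox(v, rho) -> argmax_z  <w, phi(x, z)> - (rho/2) * ||z - v||^2
    Both callbacks are assumed to return vectors in [0, 1]^n.
    """
    y = np.zeros(n)   # copy of the labelling handled by the loss solver
    z = np.zeros(n)   # copy of the labelling handled by the inference solver
    u = np.zeros(n)   # scaled dual variable enforcing the constraint y = z

    for _ in range(n_iters):
        y = loss_prox(z - u, rho)        # subproblem 1: loss term only
        z = inference_prox(y + u, rho)   # subproblem 2: inference (score) term only
        u = u + y - z                    # dual update on the consensus constraint
        if np.linalg.norm(y - z) < tol:  # stop once the two copies agree
            break
    return z
```

For brevity the sketch stops on the primal residual ||y - z|| alone; a full ADMM implementation would typically also monitor the dual residual and possibly adapt rho.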